# Efficient inference with distilled models

## NLLB-200 Distilled 600M

A distilled version of NLLB-200 with 600M parameters, supporting machine translation across 200 languages.

- Tags: Machine Translation, Transformers, Multilingual
- Author: Narsil
- Downloads: 113 · Likes: 2
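A minimal usage sketch for the entry above, using the Hugging Face `transformers` translation pipeline. The repo id `facebook/nllb-200-distilled-600M` and the FLORES-200 language codes (`eng_Latn`, `fra_Latn`) are assumptions inferred from the model name, not stated on this page.

```python
# Sketch: English-to-French translation with the distilled NLLB-200 model.
# Assumption: the Hugging Face repo id is "facebook/nllb-200-distilled-600M".
from transformers import pipeline

translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",  # NLLB uses FLORES-200 language codes
    tgt_lang="fra_Latn",
)

result = translator("Distilled models trade a little accuracy for much faster inference.")
print(result[0]["translation_text"])
```

The same pipeline call works for any of the 200 supported language pairs by swapping `src_lang` and `tgt_lang`.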
## Emotion English DistilRoBERTa Base

An English text-emotion classification model fine-tuned from DistilRoBERTa-base, predicting Ekman's six basic emotions plus a neutral category.

- Tags: Text Classification, Transformers, English
- Author: j-hartmann
- Downloads: 1.1M · Likes: 402
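A minimal usage sketch for the entry above, using the `transformers` text-classification pipeline. The repo id `j-hartmann/emotion-english-distilroberta-base` is an assumption inferred from the author and model name listed here.

```python
# Sketch: scoring all seven emotion classes (Ekman's six + neutral) for a
# sentence. Assumption: the Hugging Face repo id is
# "j-hartmann/emotion-english-distilroberta-base".
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for every class, not just the top one
)

scores = classifier("I can't believe how well this worked!")[0]
for entry in sorted(scores, key=lambda e: e["score"], reverse=True):
    print(f"{entry['label']}: {entry['score']:.3f}")
```

With `top_k=None` the pipeline returns a full probability distribution over the seven labels, which is useful when downstream code needs calibrated scores rather than a single predicted emotion.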